Reviews: Eliciting Categorical Data for Optimal Aggregation

Neural Information Processing Systems

The problem setting would be a good contribution to the literature on crowdsourcing. However, I am not sure the paper is ready for publication, for the following reasons: 1) the theoretical part does not look solid, 2) the proposed algorithm (HA) does not look well grounded, and 3) the experimental results are not significant. These points are elaborated below. Lemmas 3 and 4 are reasonable, but they cover only very special cases: Lemma 3 considers only one agent, and Lemma 4 assumes that all agents have the same amount of information (each observes exactly n samples).


Eliciting Categorical Data for Optimal Aggregation

Ho, Chien-Ju, Frongillo, Rafael, Chen, Yiling

Neural Information Processing Systems

Models for collecting and aggregating categorical data on crowdsourcing platforms typically fall into two broad categories: those that assume agents are honest and consistent but have heterogeneous error rates, and those that assume agents are strategic and seek to maximize their expected reward. The former often leads to tractable aggregation of elicited data, while the latter usually focuses on optimal elicitation and does not consider aggregation. In this paper, we develop a Bayesian model in which agents have differing quality of information but also respond to incentives. Our model generalizes both categories and enables the joint exploration, both analytical and experimental, of optimal aggregation of categorical data and optimal multiple-choice interface design.
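To make the first category concrete, a standard Bayesian aggregation rule for honest agents with heterogeneous, known accuracies is to combine their reports via weighted log-odds. The sketch below is an illustrative example of this classical rule for a binary category, not the paper's (more general) model; the function name and the example accuracies are hypothetical.

```python
import math

def aggregate_binary_reports(reports, accuracies, prior=0.5):
    """Posterior probability that the true label is 1, given independent
    binary reports from agents with known accuracies p_i > 0.5.
    (Classical weighted-majority / naive-Bayes aggregation; the paper's
    Bayesian model generalizes this setting.)"""
    # Start from the prior log-odds of label 1.
    log_odds = math.log(prior / (1.0 - prior))
    for report, p in zip(reports, accuracies):
        # Each report shifts the log-odds by +/- log(p / (1 - p)),
        # so more accurate agents carry more weight.
        llr = math.log(p / (1.0 - p))
        log_odds += llr if report == 1 else -llr
    return 1.0 / (1.0 + math.exp(-log_odds))

# Three agents report 1, 1, 0 with (assumed) accuracies 0.9, 0.6, 0.7.
posterior = aggregate_binary_reports([1, 1, 0], [0.9, 0.6, 0.7])
```

Because the high-accuracy agent (0.9) reported 1, the posterior favors label 1 despite the dissenting report; a plain majority vote would ignore these accuracy differences.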